It's becoming familiar: get arrested, go to trial, and find out the government's star witness is a half-baked algorithm built in a Silicon Valley basement, rubber-stamped by a judge who can't program a VCR. This time it's in New Jersey, where the prosecution in State v. Miles is taking this pastime to new levels of absurdity.
In the state's latest criminal-justice-meets-cyberpunk farce, Tybear Miles stands accused of the 2021 killing of Ahmad McPherson. The central piece of evidence? Not fingerprints. Not an eyewitness. A facial recognition hit from a system so secretive that the government won't even tell the defense what it is, how it works, or whether it's more accurate than a drunk dart throw.
The prosecution insists it has Miles nailed, thanks to a confidential informant who claimed that "Fat Daddy" was the killer. The cops then poked around Instagram, pulled some photos, fed them into a facial recognition system, and out popped Tybear Miles. The algorithm gave its blessing, the informant nodded in agreement, and the case was sealed.
Except there's one hitch: the defense wants to actually see how the magic sausage was made. Understandable, since getting sent to prison based on a software's hunch isn't exactly the gold standard of due process. Defense lawyers have asked for access to the guts of the system: error rates, database quality, testing protocols, anything that might show the difference between science and snake oil.
The state's answer? A firm, resounding "no." Apparently, revealing how this software works would compromise law enforcement tactics.
Cue the cavalry. Civil liberties watchdogs who've seen this sci-fi courtroom rerun one too many times filed a joint brief basically shouting: you can't have a fair trial if the evidence comes from a black box!
"Facial recognition searches involve multiple components and steps that each introduce a significant possibility of misidentification," the brief warns, in what might be the most understated way to say "this stuff screws up a lot."
The civil liberties gang is pointing to a recent New Jersey appellate decision in State v. Arteaga, where the court wisely concluded that if a computer is accusing someone of a crime, we might want to know whether it graduated from MIT or flunked out of Clippy's Academy for Glitchy Algorithms.
That ruling laid the groundwork for Miles' team to ask for the same transparency. You'd think that would be obvious. Instead, we're having a legal showdown to determine whether a man's freedom hinges on trade secrets and corporate NDAs.
This isn't only a Jersey problem. Across the country, cops have been quietly using facial recognition tech like it's a cheat code in a video game, without bothering to tell judges, juries, or the people getting locked up because a computer said so.
In that context, the Miles case is less an anomaly and more a litmus test. If the New Jersey Supreme Court sides with secrecy, it won't just gut one man's defense. It'll further enshrine the idea that algorithmic evidence is above scrutiny.